Mitigating Forgetting in Online Continual Learning via Instance-Aware Parameterization (Supplemental) Hung-Jen Chen

Neural Information Processing Systems

Algorithm excerpt: encourage the controller to search unseen blocks via Eq. 9; obtain reward r via Eq. 3.

We conduct an ablation study to show the strength of count-based search exploration, comparing the performance of InstAParam with and without count-based exploration. Although InstaNAS tries to address this problem with "policy shuffling", we found that it does not solve the problem in this scenario. The detailed accuracy is listed in Table 2. Count-based exploration helps on CIFAR-10 and does not sacrifice the initial performance. First, we focus on the distribution of the policy for each task.
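The count-based exploration referenced above can be sketched as a bonus added to the controller's reward that shrinks the more often a block has already been selected. The square-root bonus form, the `beta` coefficient, and per-block counting below are illustrative assumptions, not the paper's exact Eq. 9.

```python
import math
from collections import Counter

def exploration_bonus(counts: Counter, block_id: int, beta: float = 0.5) -> float:
    """Count-based bonus: rarely chosen blocks get a larger reward boost."""
    return beta / math.sqrt(counts[block_id] + 1)

# Simulate the controller repeatedly picking block 0 and never block 1.
counts = Counter()
for block in [0, 0, 0]:
    counts[block] += 1

# The unseen block now carries a strictly larger exploration bonus,
# nudging the controller toward architectures it has not tried yet.
assert exploration_bonus(counts, 1) > exploration_bonus(counts, 0)
```

Adding this bonus to the task reward (Eq. 3 in the paper's notation) biases the search toward unexplored blocks without changing the reward's long-run behavior, since the bonus decays as counts grow.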



InstaNAS: Instance-aware Neural Architecture Search

Cheng, An-Chieh, Lin, Chieh Hubert, Juan, Da-Cheng, Wei, Wei, Sun, Min

arXiv.org Machine Learning

Neural Architecture Search (NAS) aims at finding one "single" architecture that achieves the best accuracy for a given task such as image recognition. In this paper, we study the instance-level variation, and demonstrate that instance-awareness is an important yet currently missing component of NAS. Based on this observation, we propose InstaNAS for searching toward instance-level architectures; the controller is trained to search and form a "distribution of architectures" instead of a single final architecture. Then during the inference phase, the controller selects an architecture from the distribution, tailored for each unseen image to achieve both high accuracy and short latency. The experimental results show that InstaNAS reduces the inference latency without compromising classification accuracy. On average, InstaNAS achieves 48.9% latency reduction on CIFAR-10 and 40.2% latency reduction on CIFAR-100 with respect to the MobileNetV2 architecture.
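The per-instance selection described in the abstract can be sketched as a controller that maps an instance-level signal to a subset of active blocks, with latency proportional to the blocks kept. The `controller` and `latency` functions and the scalar difficulty feature below are hypothetical simplifications for illustration; InstaNAS's actual controller is a learned network conditioned on the input image.

```python
import random

def controller(difficulty: float, num_blocks: int = 4) -> list:
    """Toy instance-aware controller: 'easier' inputs (lower difficulty)
    keep fewer blocks on average, trading capacity for latency."""
    keep_prob = min(max(difficulty, 0.1), 0.9)  # clamp per-instance difficulty
    return [1 if random.random() < keep_prob else 0 for _ in range(num_blocks)]

def latency(arch: list, per_block_ms: float = 2.0) -> float:
    """Assume latency scales with the number of active blocks."""
    return sum(arch) * per_block_ms

random.seed(0)
easy_arch = controller(0.2)  # easy instance: few blocks tend to survive
hard_arch = controller(0.9)  # hard instance: most blocks tend to be kept
```

Under this toy model, easy instances are routed through shallower child architectures, which is the source of the average latency reductions the abstract reports.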